How the U.S. Can Counter Disinformation From Russia and China

At a March 2022 press conference, Russian Foreign Minister Sergei Lavrov accuses the United States of conducting research in Ukraine to develop biological weapons. Kay Nietfeld/Getty Images

Attempts by Russia, China, and other U.S. adversaries to spread dangerous false narratives need to be countered before they take root. 

August 14, 2024 1:19 pm (EST)

Expert Brief
CFR scholars provide expert analysis and commentary on international issues.

Dana S. LaFon is the 2023–24 National Intelligence Fellow at CFR.

Disinformation campaigns can be a powerful tool to shape beliefs on matters of great geopolitical importance. Bad actors can deploy them against rivals to sow costly discord, create political uncertainty, and deepen divides within a community. Monitoring and “pre-bunking” even the most obscure claims is important because, if left unaddressed, their damage can be hard to undo, and in some cases, those false narratives can presage a real-life attack.

A Recipe for Disinformation

There are three steps to building an effective disinformation campaign: 1) craft an influential false narrative around an egregious lie; 2) amplify the false narrative across various channels using influence principles; and 3) obfuscate the origins of the lie.

A prime example is the Russian government’s false narrative that the United States has been developing bioweapons in Ukraine for years. Importantly, this narrative was among the earliest indicators that Russia intended to invade Ukraine. A 2022 Microsoft report [PDF] found that Russian disinformation operatives “pre-positioned” the false claim in November 2021, when it was featured on a YouTube channel operated by an American based in Moscow. When Russia invaded Ukraine three months later, Kremlin-operated news sites such as RT and Sputnik News referred to the pre-positioned report as an authoritative account that justified Russia’s invasion. This narrative has been debunked repeatedly, including by NewsGuard, a U.S.-based media watchdog whose analysts are specially trained to identify the spreading of false information. This disinformation campaign is similar to one the Soviet Union ran in the 1980s, which claimed that the United States had developed HIV/AIDS as a bioweapon.

Build the False Narrative

A disinformation campaign relies on a false narrative that surrounds a lie with a sense of truth and taps into existing divides within a targeted community, which could be divisions over geopolitical issues, socioeconomic differences, or any theme that resonates. The narrative sounds believable because it has truthful elements and is associated with real-world events or people of authority. The false claim can also resonate with feelings of marginalization in the targeted audience.

In the Russian disinformation example, the kernel of truth is that the United States does help Ukraine and other former Soviet republics make their former Kremlin-operated labs safe, under the Biological Threat Reduction Program [PDF]. Someone who encounters the campaign does not need to know how biological weapons are made or how U.S. policy addresses them, because the false narrative explains these matters in a compelling way. The lie, that the United States is developing bioweapons in Ukraine, then shifts unconscious beliefs in the target and, ideally for the campaign, triggers behaviors that favor the Russian government, such as protesting U.S. weapons development in Ukraine, sending money to support a protest, or simply reposting the false narrative.

Amplify the Narrative

After the narrative is planted, it is essential that sources trusted by the target audience amplify it. These can include internet forums, social media websites, news sources, and false personas operated by Russia or by its supporters and proxies. In addition, there can be unsuspecting yet credible spokespeople, deemed “useful idiots” in the disinformation literature.

Amplification occurs through restatement and variation. For example, NewsGuard has identified 200 false claims about the Russia-Ukraine war across 473 websites. In addition to the core false claim about U.S.-operated bioweapons labs in Ukraine, other fabrications claimed that the United States developed bioweapons to target ethnic Russians; that North Atlantic Treaty Organization (NATO) advisors were hiding in a bioweapons lab beneath a steel plant in Mariupol, Ukraine; and that Ukraine conducted infectious disease experiments on its military personnel in U.S.-run biological laboratories.

Obfuscate the Source

Successfully spreading disinformation requires obscuring the provenance of the false narrative. Obfuscation is aided when numerous sources repeat the false claim, often with variations. Repeatability plus specificity equals believability. Thus, by ensuring that the false narrative is repeated by diverse sources, including “useful idiots,” often with false granular detail, and that organic sharing and reposting occur as well, the lie eventually rings true to its audience.

The repeatability of the narrative makes tracing its true source difficult. Hundreds of Russian-sourced online statements, retweets, posts, and news reports all circle back to each other. It is virtually impossible for the average consumer to source the origin of these claims or understand how they spread.

To reinforce the believability of a false narrative, its originators leverage influence principles and unconscious bias. Decades of research demonstrate that these tactics are fundamentally effective and difficult to thwart. They unconsciously bind the repeated narrative to audiences’ beliefs, which leads to changes in behavior as the audiences will naturally act in ways consistent with their beliefs, particularly if they have articulated or documented these beliefs [PDF] by reposting them on social media. Such behavior change is the ultimate goal of the false narrative author.

How to ‘Pre-bunk’

“Pre-bunking” a narrative (detecting and countering it before it is amplified), combined with building audiences’ immunity to influence tactics, is the most powerful way to prevent a disinformation campaign from taking hold in the first place. Once initiated, strong disinformation campaigns are difficult to counter. However, social science research demonstrates that the early countering of a false narrative is more likely to be effective if it provides the targeted audience an alternative, true narrative. This narrative should be detailed, remind the audience of the false narrative it is correcting, and be repeated, much like the amplification of a false narrative. Research shows that repeating the false narrative in the course of correcting it does not reinforce audiences’ belief in the disinformation. Additionally, making people aware of their vulnerability to false narratives and of the originator’s nefarious motivations can increase the effectiveness of debunking efforts.

A New Warning From China?

China is quickly catching up to Russia as an effective proliferator of disinformation. In April 2023, NewsGuard analysts spotted a false claim about a supposed U.S. bioweapons lab in Kazakhstan in a video created by China Daily, the Beijing-controlled English-language publication. The professionally produced video accused the United States of operating the laboratory to conduct secret research on the transmission of viruses to Chinese people from camels. Much of the purported “evidence” in the video was based on unsubstantiated claims first propagated by Russian disinformation websites that had stated that mysterious “mass deaths have happened” in Kazakhstan.

This accusation echoes the Russian claim about U.S. labs that was pre-positioned on YouTube before Russia invaded Ukraine. Its kernel of truth is that the United States and Kazakhstan are working to eliminate bioweapons labs in the former Soviet republic as part of a 1995 agreement to destroy infrastructure [PDF] used to create weapons of mass destruction. This Chinese disinformation appears to be a pre-positioned false claim as well. Policymakers in the United States and allied countries should take heed of this effort by Beijing and what it could augur.

It is unclear why China would create this false narrative, but it meets the criteria of an advanced strategic warning of an incoming disinformation campaign. Explaining the true situation in a way that resonates with the target audience could be the best way to undermine the false narrative before it sows discord, creates uncertainty, or deepens community divides. Examining and pre-bunking obfuscated false claims as early as possible is essential in countering disinformation, as these early false narratives could serve as indicators of cyber or physical attacks to come.

This work represents the views and opinions solely of the author. The Council on Foreign Relations is an independent, nonpartisan membership organization, think tank, and publisher, and takes no institutional positions on matters of policy.

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) License.
